Following the “TRAnsparent and Comprehensive Ecological modelling documentation” framework suggested by Grimm et al. (2014).

1 Problem formulation

The decision-making context in which the model will be used; the types of model clients or stakeholders addressed; a precise specification of the question(s) that should be answered with the model, including a specification of necessary model outputs; and a statement of the domain of applicability of the model, including the extent of acceptable extrapolations.

Figure 1.1: Some illustrative picture to test figure integration in markdown!

1.1 Summary

1.2 Motivation

1.3 Questions

With this implementation of the model, we aim to answer the following questions:

1. How … ? (explanation, including the necessary model outputs)

1.4 Use and applicability

The MGM model is suited to studying species-specific biomass production, growth processes, and potential distributions of submerged macrophytes in lakes.

2 Model description

See ODD

The model, i.e. a detailed written model description. For individual/agent-based and other simulation models, the ODD protocol is recommended as standard format. For complex submodels, include concise explanations of the underlying rationale. Model users should learn what the model is, how it works, and what guided its design.

3 Data evaluation

The quality and sources of numerical and qualitative data used to parameterize the model, both directly and inversely via calibration, and of the observed patterns that were used to design the overall model structure. This critical evaluation will allow model users to assess the scope and the uncertainty of the data and knowledge on which the model is based.

4 Conceptual model evaluation

The simplifying assumptions underlying a model’s design, both with regard to empirical knowledge and general, basic principles. This critical evaluation allows model users to understand that model design was not ad hoc but based on carefully scrutinized considerations.

5 Implementation verification

(1) Whether the computer code implementing the model has been thoroughly tested for programming errors; (2) whether the implemented model performs as indicated by the model description; and (3) how the software has been designed and documented to provide necessary usability tools (interfaces, automation of experiments, etc.) and to facilitate future installation, modification, and maintenance.
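As an illustration of item (2), a check that the implemented model performs as described can be written as an automated test. The sketch below assumes a hypothetical logistic biomass-growth step; `biomass_growth` and its parameters are illustrative placeholders, not MGM’s actual implementation:

```python
def biomass_growth(biomass, r, k):
    """One hypothetical logistic growth step: B + r * B * (1 - B / K).

    Placeholder stand-in for a biomass submodel; not MGM code.
    """
    return biomass + r * biomass * (1.0 - biomass / k)


def test_no_net_growth_at_capacity():
    # The model description implies zero net growth at carrying capacity.
    assert biomass_growth(100.0, r=0.1, k=100.0) == 100.0


def test_positive_growth_below_capacity():
    # Below capacity, biomass should increase.
    assert biomass_growth(50.0, r=0.1, k=100.0) > 50.0


test_no_net_growth_at_capacity()
test_positive_growth_below_capacity()
```

In practice such checks would run automatically (e.g. in a test suite) so that any change to the code is verified against the written model description.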

5.1 Biomass growth

5.2 Germination and mortality rate


6 Model output verification

(1) How well model output matches observations and (2) how much calibration and effects of environmental drivers were involved in obtaining good fits of model output and data.
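One common way to quantify the match in (1) is a summary error statistic such as the root-mean-square error (RMSE) between simulated and observed values. A minimal sketch with invented numbers (not MGM output or field data):

```python
import math


def rmse(predicted, observed):
    """Root-mean-square error between two equal-length sequences."""
    n = len(observed)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)


# Invented illustration values (e.g. g dry weight / m^2).
predicted_biomass = [12.0, 30.5, 48.0]
observed_biomass = [10.0, 33.0, 45.0]

print(round(rmse(predicted_biomass, observed_biomass), 2))  # → 2.53
```

Reporting such a statistic alongside the calibration effort makes it transparent how much of the fit was tuned rather than emergent.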

7 Model analysis

(1) How sensitive model output is to changes in model parameters (sensitivity analysis) and (2) how well the emergence of model output has been understood.
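A simple form of (1) is a one-at-a-time (OAT) sensitivity index: perturb each parameter by a small relative amount and record the relative change in output. The sketch below uses a hypothetical stand-in model (iterated logistic growth); the function, parameters, and values are illustrative assumptions, not MGM’s:

```python
def model(r, k, b0=1.0, steps=50):
    """Hypothetical stand-in model: iterate a logistic growth step."""
    b = b0
    for _ in range(steps):
        b += r * b * (1.0 - b / k)
    return b


def oat_sensitivity(param, baseline, delta=0.01):
    """Relative change in output per relative change in one parameter."""
    params = dict(baseline)
    y0 = model(**params)
    params[param] *= 1.0 + delta
    y1 = model(**params)
    return ((y1 - y0) / y0) / delta


baseline = {"r": 0.2, "k": 100.0}
for name in baseline:
    print(name, round(oat_sensitivity(name, baseline), 3))
```

Here the output tracks the carrying capacity `k` almost one-to-one near equilibrium, while `r` mainly affects transient dynamics; more thorough analyses would vary parameters jointly (e.g. Morris or Sobol methods).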

8 Model output corroboration

How model predictions compare to independent data and patterns that were not used, and preferably not even known, while the model was developed, parameterized, and verified. By documenting model output corroboration, model users learn about evidence which, in addition to model output verification, indicates that the model is structurally realistic so that its predictions can be trusted to some degree.